BioMed Research International
Wiley
Preprints posted in the last 30 days, ranked by how well they match BioMed Research International's content profile, based on 11 papers previously published here. The average preprint has a 0.10% match score for this journal, so anything above that is already an above-average fit.
Chen, C.; Zhao, Z. H.; Xu, L.; Gao, J. N.; Liu, X.; Quan, X. Q.; Zhang, Y. H.
Rapid prediction of the severity of acute coronary syndrome (ACS) is crucial for appropriate intervention in the emergency department. Neutrophils (Neu), lymphocytes (Lym), and monocytes (Mon) and their ratios (Neu/Lym, NLR; Mon/Lym, MLR; Neu×Mon/Lym, SIRI) are acknowledged to be associated with the prediction of severity and adverse outcomes in ACS patients. Here, we retrospectively analysed eosinophils (Eos) and novel Eos-derived ratios (Neu/Eos, NER; Mon/Eos, MER; Neu×Mon/Eos, SIII; Neu/(Eos×Lym), NEL; Mon/(Eos×Lym), MEL; Neu×Mon/(Eos×Lym), SV) in 1053 first-admitted ACS patients within 24 hours of symptom onset to predict ST-segment elevation myocardial infarction (STEMI), high Gensini score (H), and cardiac dysfunction (Killip classification grades I to III). Results showed that Eos was significantly decreased in the ST (n=227), Gensini (H) (n=311), and Killip I (n=237) groups (P<0.05). All Eos-derived ratios (NER, MER, SIII, NEL, MEL, SV) were significantly higher in the more severe diagnostic groups (ST, Gensini (H), and Killip I) (P<0.05). ROC analysis revealed that SIII and SV predicted ST and Gensini (H) with high specificity and sensitivity, similar to those of NLR, MLR, and SIRI. Conclusion: Eos and Eos-derived ratios, SIII and SV in particular, are strongly linked to the prediction of ACS severity, alongside the well-established leukocyte ratios. The new Eos ratios hold significant importance in the emergency department for rapid evaluation of ACS patients.
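As a quick illustration of how the Eos-derived ratios defined in this abstract combine a differential count, here is a minimal sketch; the function name and the example cell counts are assumptions for demonstration only, not data from the study:

```python
# Illustrative sketch of the eosinophil-derived ratios defined in the abstract.
# Inputs are absolute counts from a differential (e.g., x10^9/L); the example
# values below are made up for demonstration, not taken from the study.
def eos_ratios(neu, lym, mon, eos):
    """Return the six Eos-derived ratios named in the abstract."""
    return {
        "NER": neu / eos,                 # Neu/Eos
        "MER": mon / eos,                 # Mon/Eos
        "SIII": neu * mon / eos,          # Neu x Mon / Eos
        "NEL": neu / (eos * lym),         # Neu / (Eos x Lym)
        "MEL": mon / (eos * lym),         # Mon / (Eos x Lym)
        "SV": (neu * mon) / (eos * lym),  # Neu x Mon / (Eos x Lym)
    }

ratios = eos_ratios(neu=6, lym=2, mon=1, eos=0.5)
print(ratios["SIII"])  # 12.0
print(ratios["SV"])    # 6.0
```

Higher values of these ratios correspond to lower eosinophil counts relative to the other lineages, which is the direction the abstract associates with greater severity.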
Johnson, O. S.; Bond, C. W.; Noonan, B. C.
Background: Psychological readiness to return to sport and subjective knee function are critical outcomes following ACL reconstruction (ACLR), yet they do not always progress in parallel. An athlete may demonstrate high subjective knee function but low psychological readiness, suggesting a mental barrier to return, or conversely, report high readiness despite persistent functional limitations, raising concerns of overconfidence and reinjury risk. Understanding how these domains change together during recovery is essential for identifying mismatches that may require targeted intervention. Purpose: The purpose of this study was to examine the relationship between changes in psychological readiness (ACL-RSI) and subjective knee function (IKDC) from early to late recovery following ACLR. Study Design: Secondary analysis of prospectively collected data. Methods: Athletes (N = 48, age at ACLR = 17.7 ± 1.8 y) aged 15-25 years who underwent ACLR with an ipsilateral autograft, had a pre-injury MARX score > 8, and completed the ACL-RSI and IKDC questionnaires at 3.5 ± 1 and 7 ± 1 months post-ACLR were included. Percent changes in ACL-RSI and IKDC scores between early and late recovery were calculated. Spearman's rank correlation was used to examine the association between changes in psychological readiness and subjective knee function. Significance was set at p < .05. Results: From 3.5 ± 1 to 7 ± 1 months post-ACLR, the mean percent change in ACL-RSI was 40.7 ± 57.1% and the mean percent change in IKDC was 24.8 ± 18.1%. The percent changes in ACL-RSI and IKDC scores were moderately correlated (ρ = 0.350, 95% CI [0.089, 0.584], p = 0.012). Discussion: The main finding of this study was that subjective knee function and psychological readiness to return to sport changed in parallel from 3.5 to 7 months following ACLR.
Clinicians can use this information regarding the concordant progression of psychological readiness to return to sport and subjective knee function to personalize ACL rehabilitation for future patients. Overall, clinicians can expect that if psychological readiness improves, subjective knee function will likely improve over the 3.5- to 7-month post-ACLR time frame, and vice versa. Therefore, addressing both components at multiple time points during recovery may offer athletes the greatest likelihood of returning to sport following ACLR.
Andriazzi, V. H.; Curcio, R. P.; Novais, M. A. R. A.; Fernandes, B. L. G.; Rosa, G. C.; Vasconcelos, J. G. S.; Quineper, J. N.
Objective: To compare the efficacy and safety of etomidate versus ketamine as induction agents for rapid sequence intubation in critically ill adults, focusing on 28-day mortality and post-intubation hypotension. Data Sources: PubMed, Embase, and the Cochrane Library were systematically searched from inception to January 2026. Reference lists of included studies were also manually screened. Study Selection: We included randomized controlled trials (RCTs) comparing single-dose intravenous ketamine versus etomidate for emergency rapid sequence intubation in critically ill adults (≥18 years) in non-operating room settings (e.g., intensive care unit or emergency department). Data Extraction: Two investigators independently screened records, extracted data using a standardized form, and assessed the risk of bias using the RoB 2 tool. The certainty of evidence was evaluated using the GRADE framework. Data Synthesis: Six RCTs comprising 4,108 patients (2,046 assigned to ketamine and 2,062 to etomidate) were included. The pooled analysis showed no statistically significant difference in 28-day mortality between the ketamine and etomidate groups (39.0% vs. 40.3%; relative risk [RR] 0.96; 95% CI, 0.89-1.03; p=0.29; I²=11%). In a prespecified subgroup analysis of patients with sepsis (n=1,546), mortality also did not differ significantly (RR 0.94; 95% CI, 0.86-1.03). However, ketamine was associated with a statistically significant increase in the incidence of post-intubation hypotension (14.2% vs. 11.3%; RR 1.25; 95% CI, 1.01-1.53; p=0.04; I²=0%). No significant differences were observed regarding peri-intubation cardiac arrest, first-attempt intubation success, or ventilator- and intensive care unit-free days. Conclusions: There was no statistically significant difference in 28-day mortality between etomidate and ketamine for emergency intubation in critically ill adults, including those with sepsis.
The higher incidence of post-intubation hypotension with ketamine suggests etomidate has a more favorable hemodynamic safety profile in this setting. Key Points:
Question: Does the choice between etomidate and ketamine for emergency intubation in critically ill patients impact 28-day mortality?
Findings: In this systematic review and meta-analysis of randomized controlled trials, there was no statistically significant difference in 28-day mortality between patients induced with ketamine (39.0%) and those induced with etomidate (40.3%).
Meaning: The use of etomidate versus ketamine for rapid sequence intubation does not alter 28-day mortality, indicating that the choice of induction agent should be individualized.
Azak, A.; Avsar, M. G.; Kocak, G.; Koyuncuoglu, A.; Kilickesmez, K.; Basci, O. K.; Avci, E.
Introduction: Patients with type 2 diabetes mellitus (T2DM) are at increased risk of coronary artery disease and frequently undergo coronary angiography or percutaneous coronary intervention. Although risk factors for post-contrast acute kidney injury (PC-AKI) are well defined, effective preventive strategies remain limited. Methods: This multicenter observational cohort study included 975 patients aged 18-75 years who underwent coronary angiography and/or percutaneous coronary intervention with iodinated contrast between June 2023 and June 2024. All patients received standardized intravenous hydration. Participants were grouped according to chronic sodium-glucose co-transporter-2 (SGLT2) inhibitor use (≥3 months). PC-AKI was defined as a ≥25% or ≥0.5 mg/dL increase in serum creatinine within 48-72 hours after contrast exposure. Results: The mean age was 59.2 ± 11.7 years, and 70.8% were male; 16.9% were using SGLT2 inhibitors. PC-AKI occurred in 7.3% of patients, and 0.7% required renal replacement therapy. In univariate analysis, advanced age, diabetes, hypertension, heart failure, diuretic use, and elevated urea, creatinine, potassium, and uric acid levels were associated with PC-AKI. Higher eGFR, albumin, and sodium levels and SGLT2 inhibitor use were inversely associated. In multivariate analysis, age ≥65.5 years (OR 4.53), diabetes (OR 2.49), and uric acid >6.75 mg/dL (OR 2.34) remained independent risk factors, while eGFR >81.5 mL/min/1.73 m² (OR 0.38), sodium >137.5 mmol/L (OR 0.36), and SGLT2 inhibitor use (OR 0.09) were independently protective. Conclusion: Beyond established cardioprotective and renoprotective effects, SGLT2 inhibitors may reduce the risk of PC-AKI in patients with T2DM, potentially through decreased renal oxygen consumption and attenuation of contrast-induced hypoxic injury.
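The PC-AKI definition used in this study is a simple threshold rule on serum creatinine; a minimal sketch (the function and example values are illustrative, not from the study) could look like:

```python
# Sketch of the PC-AKI definition from the Methods: a >=25% relative or
# >=0.5 mg/dL absolute rise in serum creatinine within 48-72 h of contrast.
# Function name and example creatinine values are illustrative assumptions.
def is_pc_aki(baseline_cr, followup_cr):
    """baseline_cr, followup_cr: serum creatinine in mg/dL (48-72 h apart)."""
    absolute_rise = followup_cr - baseline_cr
    relative_rise = absolute_rise / baseline_cr
    return absolute_rise >= 0.5 or relative_rise >= 0.25

print(is_pc_aki(1.0, 1.3))  # True  (30% relative rise, despite <0.5 mg/dL)
print(is_pc_aki(2.0, 2.4))  # False (20% rise and only 0.4 mg/dL)
```

Note that either criterion alone suffices, so patients with high baseline creatinine can meet the absolute threshold without meeting the relative one, and vice versa.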
Kang, C.-Q.; Chen, L.-P.; Wang, Y.-X.
Background: Early laparoscopic cholecystectomy (ELC) is the standard treatment for acute calculous cholecystitis (ACC), but difficult laparoscopic cholecystectomy (DLC) remains a challenge. Predicting DLC and ACC severity is crucial for clinical decision-making. Methods: This retrospective single-center study included 198 ACC patients who underwent ELC. Preoperative clinical, laboratory, and imaging data were analyzed. DLC was defined by operative time >90 min, conversion, or subtotal cholecystectomy. ACC severity was graded using TG18. Multivariate logistic regression identified independent predictors. Results: DLC occurred in 81 (40.9%) patients; 102 (51.5%) had severe ACC. Serum cholinesterase (ChE) and CRP were independent predictors of DLC. CRP and male sex independently predicted ACC severity. Other markers (e.g., NLR, PCT) were not independently associated. Conclusion: Preoperative ChE and CRP levels are reliable predictors of DLC, while CRP and male sex predict ACC severity. These findings support their use in risk stratification and surgical planning.
Cistero, B.; Monforte, V.; Camprubi-Rimblas, M.; Areny-Balaguero, A.; Campana-Duel, E.; Fernandez, A.; Casabella Pernas, A.; Nuez Zaragoza, E.; Martin, I.; Tomas, A.; Minarro, I.; Vila, M.; Cuevas, M.; Sanchez, M.; Belda, X.; Lopez Rodriguez, M.; Teles, T.; Savone, M. F.; Stable, C.; Salom Merce, P.; Guijarro Viudez, C.; Tajan, J.; Goma Fernandez, G.; Martinez, M. L.; Kramer, L.; van Amstel, R.; Diaz Santos, E.; Blanch, L.; Gene Tous, E. M.; Bos, L.; Artigas Raventos, A.; Ceccato, A.
Sepsis is a complex condition with a time-dependent evolution. Longitudinal biomarker dynamics could help us better characterise sepsis. We hypothesised that biomarker kinetics are associated with sepsis and with the intensity of organ dysfunction, and may have predictive capacity for patient survival. This single-centre, prospective, observational study included adult patients presenting to the Emergency Department (ED) with suspected infection. Patients were included if they had a National Early Warning Score 2 (NEWS 2) of 3 or higher. Blood samples were obtained at baseline, 4 h, and 24 h. Linear mixed models were constructed to analyse the association between biomarker concentrations over time, sepsis diagnosis, and organ dysfunction severity. Joint models were used to evaluate the predictive ability of individual biomarker kinetics during the first 24 hours for in-hospital mortality. Of 214 screened patients, 173 were analysed, and 137 (79%) developed sepsis. Linear mixed models revealed time-dependent decreases in IL10 (β -0.016, 95% CI -0.028 to -0.004), IL1RN (β -0.014, 95% CI -0.024 to -0.004), and IL6 (β -0.012, 95% CI -0.024 to 0.00). Sepsis was associated with higher IL1RN (β 0.378, 95% CI 0.153-0.603) and TNFRSF1A (β 0.40, 95% CI 0.21-0.58); only models evaluating IL6 showed a significant interaction between sepsis and time (β -0.14, 95% CI -0.028 to 0.00). SOFA correlated with elevated IL10 (β 0.048, 95% CI 0.021-0.075), IL1RN (β 0.044, 95% CI 0.017-0.071), CCL2 (β 0.046, 95% CI 0.021-0.071), TNFRSF1A (β 0.050, 95% CI 0.030-0.070), and PCT (β 2.63, 95% CI 1.32-3.93); the interaction between SOFA score and time was significant only for IL6 (β -0.003, 95% CI -0.005 to -0.001). Joint survival models (adjusted for age and highest SOFA) identified IL8 (HR 0.655, 95% CrI 0.582-0.728), TNFRSF1A (HR 0.505, 95% CrI 0.419-0.682), and PCT (HR 1.004, 95% CrI 1.001-1.008) as predictors.
Conclusion: Sepsis diagnosis and severity of organ dysfunction may be associated with higher levels and kinetics of inflammatory biomarkers such as IL1RN and TNFRSF1A. IL6 levels showed a significant interaction of time with both sepsis diagnosis and SOFA score. TNFRSF1A, IL8, and PCT dynamics were associated with survival and could be useful in developing prognostic tools.
Cai, L.; Hua, Y.; Lu, W.; Bing, H.; Gao, Q.; Zhang, W.
The red cell distribution width (RDW) is a recognized prognostic marker in sepsis, yet its dynamic changes over time and their relationship with outcomes remain underexplored. This study aimed to identify distinct RDW trajectories during the early phase of sepsis and evaluate their association with mortality. We conducted a retrospective cohort study using data from the MIMIC-IV database (n=3,813) as the derivation cohort and from the First Affiliated Hospital of Kunming Medical University (n=467) for external validation. Sepsis patients with at least seven RDW measurements within the first ten days of hospitalization were included. Group-based trajectory modeling (GBTM) was employed to identify RDW trajectories. A three-trajectory model was selected based on model fit indices and clinical interpretability: Trajectory 1 (Slow-Decrease, 32.97%), Trajectory 2 (Slow-Increase, 43.30%), and Trajectory 3 (Fluctuating-Rapid Decrease, 23.73%). In our study, Cox models adjusted for confounders revealed that, compared to Trajectory 1, Trajectory 3 was independently associated with significantly increased 30-day (HR 1.47, 95% CI 1.17-1.84) and 90-day mortality (HR 1.54, 95% CI 1.25-1.88). Conversely, Trajectory 2 was associated with the most favorable survival rates. Kaplan-Meier analysis consistently showed the highest mortality in the Trajectory 3 group. External validation confirmed the model's robustness and the consistent prognostic value of the identified trajectories. We conclude that dynamic RDW trajectories, readily identifiable from routine clinical data, provide significant prognostic information beyond single-time-point measurements and can aid in the risk stratification of sepsis patients.
Moser, J. D.; Bond, C. W.; Noonan, B. C.
Objectives: Compare Anterior Cruciate Ligament (ACL) Return to Sport after Injury (ACL-RSI) scores over time following ACL reconstruction (ACLR) between male and female patients aged 15 to 25 years with primary ACL injuries and ACL reinjuries. Design: Retrospective cohort design. Setting: Sports physical therapy clinics. Participants: 332 patients aged 15-25 years who underwent ACLR following either primary ACL injury or ACL reinjury (either contralateral or ipsilateral graft reinjury) and had at least one observation of the ACL-RSI. Main Outcome Measures: ACL-RSI score. Results: ACL-RSI scores significantly increased over time post-ACLR (p < .001), males reported significantly higher scores compared to females (p < .001), and patients with contralateral ACL reinjury demonstrated higher scores than those with ipsilateral ACL graft reinjury (p = .006), though there was no difference in scores between patients with primary ACL injury and ACL reinjury. A significant interaction effect of sex and injury status was also observed (p = .009), generally demonstrating that females had lower psychological readiness compared to males across injury statuses. Conclusions: ACL-RSI following ACLR varies based on biological sex and time post-ACLR, though ACL reinjury, independent of the reinjured leg, does not appear to affect scores compared to primary ACL injury.
Tefera, B.; Ali, R.; Megersa, B. S.; Girma, T.; Friis, H.; Abera, M.; Belachew, T.; Olsen, M. F.; Filteau, S.; Wells, J. C.; Wibaek, R.; Yilma, D.; Nitsch, D.
Introduction: Measuring glomerular filtration rate (GFR) directly is invasive. Therefore, in clinical care, estimated GFR is derived from serum levels of endogenous filtration markers such as creatinine and cystatin C. Multiple studies from high-income countries have shown differences between estimated glomerular filtration rate based on cystatin C (eGFRcys) and creatinine (eGFRcr). This study aimed to assess the agreement between eGFRcys and eGFRcr in Ethiopian children and to identify factors associated with higher eGFRcys and eGFRcr. Methods: We studied 350 Ethiopian children who were part of the iABC birth cohort study. At the most recent follow-up (average age 10 years), serum cystatin C and creatinine were measured. Formulas by Berg (2015) and Hoste (2014) were used to estimate eGFRcys and eGFRcr, respectively, and Bland-Altman plots assessed their agreement. The difference in eGFR (eGFRdiff) was calculated and categorized as less than -15 mL/min/1.73 m² (higher eGFRcr), between -15 and <15 mL/min/1.73 m² (concordant), and greater than or equal to 15 mL/min/1.73 m² (higher eGFRcys). Multinomial logistic regression was used to identify factors associated with higher eGFRcr and higher eGFRcys. Results: Estimated glomerular filtration rate (eGFR) varied markedly depending on the estimation formula used. Using the Berg (2015) and Hoste (2014) formulas, the median (IQR) eGFRcys and eGFRcr were 99.4 (90.0; 114.1) and 123.2 (110.3; 143.8) mL/min/1.73 m², respectively. Overall, we observed poor agreement between eGFRcys and eGFRcr, with only 94 (27.6%) children having concordant results compared to 220 (64.7%) with higher eGFRcr and 26 (7.6%) with higher eGFRcys. If the eGFRcys results are considered reliable, 27.5% of the children had eGFR below 90 mL/min/1.73 m². Conclusion: There was very marked variation in the distributions of estimated GFRs depending on which paediatric formulas were used.
Agreement between eGFR estimated using cystatin C and creatinine was poor among Ethiopian children. Relative to eGFRcys, the creatinine-based equation may overestimate kidney function by up to 30 mL/min in this Ethiopian cohort. Ideally, a validation study with GFR measured by a gold-standard method (inulin clearance) in children is required. However, given the invasive nature and cost of inulin clearance, iohexol clearance studies are recommended instead.
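The eGFRdiff categorisation described in the Methods is a straightforward threshold rule; a minimal sketch, with an illustrative function name and the median values from the abstract as an example:

```python
# Sketch of the eGFRdiff categorisation from the Methods:
# eGFRdiff = eGFRcys - eGFRcr, in mL/min/1.73 m^2, with cut-offs at +/-15.
# The function name is an illustrative assumption.
def classify_egfr_diff(egfr_cys, egfr_cr):
    diff = egfr_cys - egfr_cr
    if diff < -15:
        return "higher eGFRcr"
    elif diff < 15:
        return "concordant"
    return "higher eGFRcys"

# Using the median values reported in the abstract as a single worked example:
print(classify_egfr_diff(99.4, 123.2))  # higher eGFRcr
```

Applied to the medians (eGFRdiff = -23.8 mL/min/1.73 m²), the rule lands in the "higher eGFRcr" category, consistent with the majority (64.7%) of children in the cohort.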
Okubo, Y.; Phu, S.; Chaplin, C.; Hicks, C.; Coleman, E.; Humburg, P.; Martinez, P. S.; Lord, S.
BACKGROUND: Fall injuries in older adults are devastating and often caused by impaired reactive balance to unexpected trips and slips, which conventional exercise programs do not target. This study examined whether a low-dose perturbation balance training (PBT) program for older adults can improve balance recovery following trips and slips and reduce falls and fall injuries. METHODS: 111 older adults (65+ years) were randomised into an intervention or control group. The intervention group undertook one weekly PBT session for three weeks on the Trip and Slip Walkway, followed by three-monthly PBT booster sessions over one year, for a total of six sessions. The control group received an educational booklet. Blinded staff assessed laboratory falls induced by a trip and slip with a safety harness at baseline and one year. Numbers of falls and fall injuries in daily life were collected weekly for one year. RESULTS: Compared to the control group, the intervention group experienced a 26% reduction in laboratory falls at 12 months (RR = 0.74; 95% CI: 0.54, 0.99; P = .040), but no difference in the number of falls or trip and slip encounters in daily life. However, fall-related injuries were reduced by 57% (rate ratio = 0.43; 95% CI: 0.19, 0.94; P = .024) over one year. A reduction in falls occurred within the first three months, with greater benefit among participants who completed at least three training sessions. CONCLUSIONS: A low-dose PBT program improved reactive balance over 12 months and reduced injurious falls by 57%, with benefits likely due to enhanced reactive balance rather than proactive gait strategies. Older adults may require at least three sessions to achieve meaningful fall reduction, with periodic booster sessions to sustain benefits. Incorporating PBT into exercise programs may enhance their efficacy in preventing falls and fall injuries in daily life.
Key Points: A low-dose perturbation-based training program (six sessions over 12 months) improved reactive balance at 12 months and reduced injurious falls by 57%. Benefits are likely due to task-specific improvements in reactive balance against trips and slips rather than proactive gait strategies or other risk factors. Incorporating PBT into exercise programs may improve their efficacy in preventing falls and fall injuries in daily life. Why does this paper matter? Falls are the leading cause of injury-related hospitalization and loss of independence in older adults. By targeting reactive balance, an ability neglected by conventional exercise programs, this program offers a novel, evidence-based approach to enhance fall prevention and reduce injuries.
Sun, Y.; Pan, Z.; Sun, J.; Sun, Y.; Wang, W.; Liang, M.; Zhang, A.; Wu, Q.; Sheng, H.; Yang, J.
Background: Severe Fever with Thrombocytopenia Syndrome (SFTS) is an acute infectious disease with high mortality. This study aimed to develop a quantitative scoring system for grading SFTS severity using dynamic clinical data. Methods: A retrospective study included 547 confirmed SFTS patients from two hospitals. Clinical data were collected over a 14-day course (divided into four phases). Patients were grouped into survivors (n=451) and non-survivors (n=96). Statistical analyses, including Kaplan-Meier curves and log-rank tests, were performed. An external validation cohort of 44 new patients was used to validate the scoring system via the C-statistic, calibration curves, and decision curve analysis (DCA). Results: Of 547 patients, 96 (17.55%) were non-survivors. Multivariate logistic regression identified six independent prognostic factors across phases: age, platelet count (PLT), aspartate aminotransferase (AST), and creatinine (Cr) (days 5-7); age, red blood cell distribution width (RDW), Cr, and lactate dehydrogenase (LDH) (days 8-10); and Cr and LDH (days 11-14). A scoring system (0-11 points) was developed, stratifying patients into low (0-3), intermediate (4-7), and high (8-11) risk groups, with adverse outcome rates of 1.04%, 22.92%, and 76.04%, respectively. Kaplan-Meier curves showed significant prognostic differences (log-rank P<0.001). External validation (44 cases) confirmed excellent performance: AUC 0.810-0.952, good calibration (Hosmer-Lemeshow P>0.05), and net clinical benefit (DCA Eavg 0.068-0.098, Emax 0.422-0.559). Conclusion: A dynamic SFTS severity scoring system was developed and validated. Internal and external validation confirmed its reliability and clinical utility, providing a simple, practical tool for timely assessment and early intervention.
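The risk stratification of the 0-11 point score reduces to simple cut-offs; a minimal sketch (function name is an illustrative assumption; the cut-offs and adverse outcome rates are taken from the abstract):

```python
# Sketch of the risk-group stratification for the 0-11 point SFTS score
# described in the abstract. The function name is illustrative; only the
# cut-offs and outcome rates come from the reported results.
def sfts_risk_group(score):
    """Map a total score (0-11) to its risk group."""
    if not 0 <= score <= 11:
        raise ValueError("score must be between 0 and 11")
    if score <= 3:
        return "low"           # adverse outcome rate 1.04%
    if score <= 7:
        return "intermediate"  # adverse outcome rate 22.92%
    return "high"              # adverse outcome rate 76.04%

print(sfts_risk_group(5))  # intermediate
```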
Babir, F. J.; Marcotte-Chenard, A.; Sandilands, R. E.; Falkenhain, K.; Mulkewich, N.; Islam, H.; McCarthy, S. F.; Richards, D. L.; Madden, K.; Singer, J.; Riddell, M. C.; Jung, M. E.; Gibala, M. J.; Little, J. P.
Aims/hypothesis: To investigate the feasibility and preliminary efficacy of a 12-week, remotely delivered exercise snacks (ES) intervention in adults with type 2 diabetes. Methods: Insufficiently active adults with type 2 diabetes (N=69; 46 females; mean age ± SD: 58±11 years) were randomized to an ES or a mobility/stretching comparator group (CON), which involved 4 × 1-min bouts of vigorous or low-intensity exercise, respectively, on ≥5 days/week. The primary outcome was feasibility based on adherence. Secondary outcomes included exercise enjoyment (1-7 scale), rating of perceived exertion (RPE; 0-10 scale), heart rate (HR), hemoglobin A1c (HbA1c), blood biomarkers of cardiometabolic health, 30-second sit-to-stand capacity, grip strength, estimated maximal oxygen uptake, and anthropometrics. Results: Weekly adherence (estimated marginal mean [95% confidence interval]: 18 bouts [16 to 21] for both groups; P=0.99) and total enjoyment (ES: 4.5 [4.1 to 4.8] vs CON: 4.3 [4.0 to 4.7]; P=0.64) were high and did not differ between groups. Despite higher RPE (5.7 [5.4 to 6.1]) and peak HR (73 [70 to 77]% of age-predicted HR maximum) in ES vs CON (2.0 [1.7 to 2.4] and 61 [58 to 64]% of age-predicted HR maximum, respectively) (all P<0.001), there were no between-group differences in the change in any secondary outcome (all P>0.05) except for greater sit-to-stand capacity in ES after training (between-group effect estimate [95% confidence interval]: 1.9 repetitions [0.3 to 3.4]; P=0.02). Conclusions/interpretation: Exercise snacks were feasible to perform in the real world and improved physical capacity to a greater extent than CON in adults living with type 2 diabetes.
Trial registration: ClinicalTrials.gov ID NCT06407245
Research in Context
What is already known about this subject? Exercise snacks (≤1-min bouts of vigorous exercise spaced out across the day) are a time-efficient and practical approach to promote vigorous exercise and break up sedentary time. Real-world exercise snack interventions appear feasible for middle-aged and older adults.
What is the key question? Are 12 weeks of exercise snacks performed in the real world feasible for insufficiently active adults living with non-insulin-treated type 2 diabetes?
What are the new findings? Exercise snacks are feasible for those living with type 2 diabetes to perform unsupervised in the real world, based on high adherence, enjoyment, and participant retention rates. Exercise snacks improved 30-second sit-to-stand capacity and reduced waist circumference, suggesting enhancements in physical capacity and body composition.
How might this impact on clinical practice in the foreseeable future? Exercise snacks could be utilized to help individuals living with type 2 diabetes build a routine or habit of incorporating small amounts of physical activity into their daily lives. The improved physical capacity observed in the current study could contribute to lower fall risk and greater lower-body strength in those with type 2 diabetes as they age.
Daoust, R.; Williamson, D.; Arbour, C.; Perry, J. J.; Berthelot, S.; Huard, V.; Archambault, P.; Emond, M.; Rouleau, D.; Morris, J.; Lessard, J.; Kochoedo, M.; Cournoyer, A.
Introduction: Recent evidence has shown that vitamin C has analgesic properties in the immediate postoperative context. However, while a clinical trial is currently underway to evaluate vitamin C for reducing opioid consumption in emergency department (ED) patients with acute musculoskeletal (MSK) injuries, its direct analgesic effect in this population has not yet been established. This pilot study evaluated the feasibility of conducting a randomized placebo-controlled trial to determine the analgesic effect of vitamin C alone compared with placebo in ED patients with acute MSK injuries. Methods: We conducted a double-blind, randomized controlled pilot trial stratified by fracture status in a tertiary care center. Adults (≥18 years) presenting to the ED with MSK injuries of ≤48 hours' duration and pain intensity >3/10 were randomized to receive vitamin C 900 mg twice daily for three days or placebo. Participants completed a six-day diary (electronic or paper) and were contacted on day six to document analgesic use, treatment adherence, and pain intensity. Results: Overall, 147 patients were screened; 63 (42.9%) were excluded and 24 (16.4%) refused, leaving 60 (41.1%) participants, with a consent rate of 13.0/month. Mean age (SD) was 41.8 (14.2) years and 50% were female. The lost-to-follow-up rate differed between participants with an electronic diary (n=7; 16.7%) and those with a paper diary (n=10; 55.6%). Patient compliance with treatment was 97.6%. The least-squares mean difference between group A and group B in the time-weighted sum of pain intensity differences over 72 hours (SPID72) was 348.7 (95% confidence interval [CI]: -698.9 to 1396.4) for the intention-to-treat analysis and 357.6 (95% CI: -709.7 to 1424.8) for the per-protocol analysis. Conclusion: This pilot study supports the feasibility of a larger randomized controlled trial on the analgesic properties of vitamin C for ED patients with acute MSK injuries.
Strategies to reduce missed-patient and lost-to-follow-up rates are proposed. Trial registration number: NCT06306183 (ClinicalTrials.gov).
Meng, G.; Chen, Y.; Dai, M.; Tang, S.; Chen, Q.
Background: Self-management is essential for stroke survivors to maintain a healthy lifestyle and reduce recurrence risk. Although theory-based self-management interventions are widely recommended, the theoretical frameworks underpinning them and their comparative effectiveness remain unclear. Aims: To systematically identify the theories, models, and frameworks (TMFs) used in self-management interventions for stroke survivors, to explore how they guide interventions, and to evaluate their effectiveness on self-management behaviors and self-efficacy. Methods: PubMed, Embase, Web of Science, ProQuest Health & Medical Collection, and the Cochrane Library were searched from inception to July 15, 2025. Randomized controlled trials or quasi-experimental studies evaluating theory-based self-management interventions for stroke survivors were included. Two reviewers independently screened studies, extracted data, and assessed risk of bias (Cochrane RoB 2.0). Meta-analyses were performed using random-effects models. Results: From 11,495 records, 32 studies with 3,212 participants were included. Sixteen distinct TMFs were identified; self-efficacy theory was most frequent (13/32), followed by social cognitive theory (6/32). All TMFs were middle-range theories. Meta-analysis showed TMF-based interventions significantly improved self-management behaviors (SMD = 4.26, 95% CI: 0.20-8.31, I² = 98.2%) and self-efficacy (SMD = 0.60, 95% CI: 0.32-0.88, I² = 72.8%). However, the effect for behaviors is likely inflated due to extreme heterogeneity and theoretical diversity. Theory-specific analysis of self-efficacy theory (k = 8) confirmed significant effects on self-efficacy (SMD = 0.64, 95% CI: 0.21-1.08). Conclusions: This review identified 16 distinct theoretical models; self-efficacy theory was most frequently applied, followed by social cognitive theory.
Theory-based interventions significantly improved self-management behaviours and self-efficacy.
Johnson, L. R.; Bond, C. W.; Noonan, B. C.
Background: Quadriceps weakness may reduce sagittal plane shock absorption during landing, shifting load toward the frontal plane and increasing knee abduction moment (KAM), a biomechanical risk factor for anterior cruciate ligament (ACL) injuries. Purpose: The purpose of this study was to evaluate the association between isokinetic quadriceps strength and peak KAM during drop vertical jump landing in adolescent athletes. Study Design: Secondary analysis of previously collected data. Methods: Healthy adolescent athletes completed quadriceps strength testing using an isokinetic dynamometer and a biomechanical assessment during a drop vertical jump task. Quadriceps strength was quantified as peak concentric torque and the peak external KAM was calculated during the landing phase on the dominant limb. Both strength and KAM were normalized to body mass. Linear regression was used to examine the association between normalized quadriceps strength and peak external KAM on the dominant limb. Results: The association between quadriceps strength and peak normalized KAM on the dominant limb was not statistically significant (β = -0.053, 95% CI [-0.137, 0.030]; F(1,119) = 1.62, R² = 0.013, p = 0.206). Quadriceps strength explained only 1.3% of the variance in peak KAM, indicating a negligible association between these variables in this cohort. Discussion: Quadriceps strength was not associated with peak normalized KAM during landing, suggesting that frontal-plane knee loading during a drop vertical jump is not meaningfully explained by maximal concentric quadriceps strength alone. KAM appears to be driven more by multi-joint movement strategy and neuromuscular coordination than by the capacity of a single muscle group.
Luo, X.; Huang, H.; Xu, S.; Li, G.; Zhang, Y.; Luo, Y.; Kong, Q.; Liu, C.; Xie, Y.; Deng, G.; Wang, Y.; Ao, D.; Lan, L.; Yu, Y.; Tang, Z.; Wang, W.
Background: Successful recanalisation without functional independence is a frequent phenomenon following endovascular thrombectomy for large vessel occlusion stroke. Aim: To demonstrate the safety and efficacy of adjunct tirofiban therapy after endovascular thrombectomy in patients with anterior circulation large vessel occlusion stroke achieving successful recanalisation, defined as modified Thrombolysis In Cerebral Infarction (mTICI) 2b-3. Design: The study of adjunct tirofiban treatment after successful endovascular thrombectomy recanalisation (ATTRACTION) is a multicenter, prospective, double-blind, randomised trial enrolling 1360 patients in China. Eligible patients will be randomised 1:1 to either the tirofiban or placebo group. Outcome: The primary efficacy outcome is the proportion of participants with a modified Rankin Scale (mRS) score of 0-2 at 90 days, and the primary safety outcome is symptomatic intracranial haemorrhage within 48 hours from randomisation. Conclusion: This study will provide evidence on the efficacy and safety of sequential tirofiban therapy after successful recanalisation in patients with anterior circulation large vessel occlusion stroke. Trial registration number: NCT06265051. What is already known on this topic: Successful recanalisation without functional independence is a frequent phenomenon following endovascular thrombectomy, and previous small-sample, retrospective studies supported the administration of adjunct tirofiban therapy in patients achieving successful recanalisation after endovascular thrombectomy. What this study adds: The ATTRACTION trial aims to assess the efficacy and safety of adjunct tirofiban therapy, and the protocol describes the rationale and design of the trial. How this study might affect research, practice or policy: The ATTRACTION trial will inform whether tirofiban therapy after successful recanalisation by endovascular thrombectomy can improve patient outcomes.
Sakoda, S.; Kajiwara, K.; Shuto, R.; Kumagae, H.; Yokoi, O.; Kawano, K.
Context: Clinical assessments of landing mechanics often require complex scoring systems or laboratory-based motion analysis, which can limit feasibility in routine practice. A visually based landing-mechanics score centered on a standardized optimal joint-alignment configuration ("Zero Position") may offer a simple, clinically deployable alternative. Objective: To determine the intra- and inter-rater reliability of a landing mechanics score based on standardized optimal joint alignment at the moment of maximal center-of-mass (COM) descent. Design: Cross-sectional reliability study. Setting: University athletic training facility. Patients or Other Participants: Ninety healthy male collegiate athletes. Main Outcome Measures: Landing mechanics were evaluated using frontal- and sagittal-plane video recordings, with scoring performed on the frame corresponding to maximal COM descent. Five criteria reflecting the standardized joint configuration ("Zero Position") were assessed. Intra- and inter-rater reliability were calculated using Cohen's kappa coefficients and Kendall's W. Results: All five criteria demonstrated moderate to substantial intra-rater reliability and moderate to almost perfect inter-rater reliability. The total landing-mechanics score showed excellent agreement across all comparisons. The scoring system required minimal training and was feasible to implement using standard video recordings. Conclusions: The landing-mechanics score centered on the Zero Position demonstrated high reliability and strong clinical feasibility. This simple, visually grounded assessment may support routine clinical screening, injury-risk evaluation, and return-to-sport decision-making. Future research should examine its applicability to single-leg landings and sport-specific high-risk movements.
Hatakeyama, S.; Hirose, Y.; Akashi, Y.; Kusama, T.; Ishimaru, N.; Morimoto, E.; Iwashima, S.; Suzuki, K.; Enomoto, K.; Suzuki, S.; Sekine, M.; Nishimura, T.; Terada, N.; Takahashi-Igari, M.; Abe, M.; Yamada, K.; Kato, D.; Ohkusu, K.; Suzuki, H.
The rapid diagnosis of Campylobacter infections is important for the management of infectious gastroenteritis. Although stool culture is considered the gold standard, its sensitivity is limited and it requires prolonged incubation times. We performed a prospective multicenter study at nine healthcare facilities in Japan to evaluate a Campylobacter rapid antigen test using stool specimens between March 2024 and August 2025. Patients with suspected infectious gastroenteritis were consecutively enrolled and tested using QuickNavi-Campylobacter, with results compared against the FilmArray Gastrointestinal Panel as the reference method. Discordant results were further evaluated by culturing and additional PCR assays. In total, 410 patients were included in the final analysis. The positive, negative, and total concordance rates between QuickNavi-Campylobacter and the FilmArray Gastrointestinal Panel were 79%, 99%, and 93%, respectively. The positive concordance rate decreased in specimens collected ≥ 6 days after the onset of symptoms (50%). QuickNavi-Campylobacter demonstrated relatively good concordance with the FilmArray Gastrointestinal Panel in a real-world multicenter setting. These results suggest that this rapid antigen test may be particularly useful for the early diagnosis of suspected campylobacteriosis.
Rakhshanda, S.; Jonnagaddala, J.; Liaw, S.-T.; Rhee, J.; Rye, K.-A.
The objective of this systematic review and meta-analysis was to identify the interventions used to manage intolerance in patients receiving statins for primary prevention of CVD and to determine the effectiveness of these interventions. This study was conducted according to the PRISMA checklist. The electronic databases MEDLINE (PubMed), SCOPUS, EMBASE, and CINAHL were searched for studies published until June 2025. Based on the NLA definition of statin intolerance, the outcomes were split into adverse effects caused by statins and statin discontinuation. In total, 1,238 studies were identified and screened. Nine studies were eligible for systematic review, and six studies were eligible for meta-analysis. The identified intervention strategies were adjuvant therapy, statin titration, replacing statins with other lipid-lowering agents, and switching to a different statin. The meta-analysis showed that the pooled risk ratio (RR) relative to control was 0.97 (95% CI, 0.86-1.08) in randomized controlled trials and 0.94 (95% CI, 0.63-1.42) overall, with point estimates in favour of intervention arms. Moderate to substantial heterogeneity was observed, with I² between 27% and 57%. Due to the small number of studies, no clear conclusions can be drawn regarding how the implemented interventions may affect statin discontinuation. This study showed no strong evidence that the implemented interventions reduced statin intolerance. PROSPERO registration number: CRD42024587573. Highlights: This study found that the intervention strategies used to manage intolerance in patients receiving statins for the primary prevention of cardiovascular diseases were adjuvant therapy, statin titration, replacing statins with other lipid-lowering agents, and switching to a different statin.
This study showed no strong evidence that the implemented interventions reduced statin intolerance. Due to the small number of studies, no clear conclusions can be drawn regarding how the implemented interventions may affect statin discontinuation.
Nasire, R.; Nasir, A.; Puca, D.; Charles, K.; Richman, M.; Foster, D.
This study explores the influence of social determinants of health (SDOH) on follow-up behavior among patients referred to community-based organizations (CBOs) in the Emergency Department (ED) of Long Island Jewish (LIJ) Medical Center. A retrospective analysis was conducted on data collected from 342 patients who were screened for SDOH between February and July 2023. Descriptive statistics and Chi-squared tests were used to identify potential associations between demographic and social factors (race, language, age, gender, employment status, and insurance status) and follow-up rates. The results revealed several trends: non-White patients (73.2%) and non-English speakers (81.8%) followed up more frequently than their counterparts, as did older adults (80.0%) and insured patients (77.8%). However, none of the variables reached statistical significance (all p-values > 0.05). The findings suggest that while demographic and social factors may influence follow-up behavior, the lack of statistical significance could be attributed to the limited sample size. These trends align with previous literature on SDOH and follow-up behavior, highlighting the need for further research with larger, more representative samples. Addressing the complex interplay of SDOH, including factors such as language, insurance, and cultural differences, is crucial for improving follow-up rates and ensuring better health outcomes for underserved populations. Future research should focus on refining referral systems, exploring additional socioeconomic factors, and conducting longitudinal studies to develop more effective strategies for integrating SDOH interventions in healthcare systems.